Project Sekai
🔒 UMDCTF 2023 / ✅-ml-torchics-revenge
Torchic's Revenge - 500 points
Category: ML
Description: Torchic struggled with TensorFlow's complex syntax, but after trying PyTorch, it felt like a bird of prey soaring through the sky. It became a PyTorch fanatic, squawking about it like a birdbrain on caffeine.
Author: Segal
Download: https://drive.google.com/file/d/1Kj8GQUfMY_EHSI9ifmd-3AtuIVMVdXlO/view
Files: No files.
Tags: No tags.
Sutx pinned a message to this channel. 04/28/2023 3:00 PM
@Violin wants to collaborate 🤝
@fleming wants to collaborate 🤝
@unpickled admin bot wants to collaborate 🤝
@TheBadGod wants to collaborate 🤝
unpickled admin bot 04/30/2023 2:41 AM
@TheBadGod its 3 am for me but if you need anything done with torch i can prob do it (edited)
TheBadGod
i have no idea what to do rn, like i can't load the model
02:42
_pickle.UnpicklingError: A load persistent id instruction was encountered, but no persistent_load function was specified.
unpickled admin bot 04/30/2023 2:43 AM
uhhhhhhhhhhh
02:43
send model?
02:43
and what you are doing? (edited)
02:43
there was a .pkl
02:43
right
02:43
can i have that?
02:43
inb4 the organisers put a pickle deserialisation bug for the funnis
TheBadGod
binwalk should give you a zip
02:43
extract zip
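For context: binwalk finds embedded files by scanning for known magic bytes and can extract what it matches. A minimal sketch of the same idea for an appended zip, using synthetic data in place of the real challenge binary:

```python
import io
import zipfile

def carve_zip(blob):
    """Scan for a zip local-file-header signature (PK\\x03\\x04) and open
    the archive from that offset -- the signature-scanning idea that
    binwalk automates for many formats at once."""
    off = blob.find(b"PK\x03\x04")
    if off == -1:
        return None
    return zipfile.ZipFile(io.BytesIO(blob[off:]))

# demo: a fake "binary" with a zip appended after some non-zip bytes
inner = io.BytesIO()
with zipfile.ZipFile(inner, "w") as z:
    z.writestr("0", b"\x00\x00\x00\x00")  # stand-in for a raw weights file
blob = b"\x7fELF...not actually a zip yet..." + inner.getvalue()
print(carve_zip(blob).namelist())
```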
unpickled admin bot 04/30/2023 2:43 AM
oh
@deuterium wants to collaborate 🤝
TheBadGod
and a bunch of other files
02:44
whole zip
157.87 MB
02:44
another 150 mb kekw
unpickled admin bot 04/30/2023 2:44 AM
we love that data
02:45
ok wtf
02:45
i got told
02:45
these were weights
TheBadGod
i have no idea what it is
unpickled admin bot 04/30/2023 2:45 AM
chall desc lies 😭
02:46
unless they are some weird way of storing weights
TheBadGod
It looks like the binary is trying to run a neural network on a specific input, but it doesn't really work. At least the weights and the input are still there.
02:46
this the hint
unpickled admin bot 04/30/2023 2:47 AM
im hurting inside rn
02:47
just
02:47
running this
02:47
feels like a violation of so many rules
02:50
welp who wants to rev the disassembly
02:50
(joking)
TheBadGod
i did a bit, there's references to activation functions relu and sigmoid, there are four linear layers dimensions go like this:420 -> 6969 -> 3000 -> 1337 -> 8008 (edited)
02:51
still can't load the data
02:51
nor the biases
TheBadGod
i did a bit, there's references to activation functions relu and sigmoid, there are four linear layers dimensions go like this:420 -> 6969 -> 3000 -> 1337 -> 8008 (edited)
unpickled admin bot 04/30/2023 2:52 AM
tyty
02:53
ok problem tho which ones are relu which ones are sigmoid (edited)
02:53
they can be diff ones right
TheBadGod
i did a bit, there's references to activation functions relu and sigmoid, there are four linear layers dimensions go like this:420 -> 6969 -> 3000 -> 1337 -> 8008 (edited)
unpickled admin bot 04/30/2023 2:56 AM
do you know where these refs/activation functions are in the sequence? (edited)
TheBadGod
uuh, i do not
unpickled admin bot 04/30/2023 2:56 AM
kk
TheBadGod
but i know that we crash because the deserialization fails
TheBadGod
_pickle.UnpicklingError: A load persistent id instruction was encountered,
unpickled admin bot 04/30/2023 2:57 AM
this means we need to pass a function to handle returning objects for persistent ids
02:57
but
02:57
we (presumably) dont know the objects
02:57
which is great (in a sarcastic way) (edited)
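Background on the error: pickle's persistent-ID mechanism lets a writer store a reference via `persistent_id` instead of the object itself, and loading then needs a matching `persistent_load`. A self-contained round-trip (names here are illustrative, not from the challenge):

```python
import io
import pickle

class RefPickler(pickle.Pickler):
    # store a reference string instead of pickling the bytes themselves
    def persistent_id(self, obj):
        if isinstance(obj, bytes):
            return "blob-0"   # pretend the bytes live in an external file
        return None           # everything else is pickled normally

class RefUnpickler(pickle.Unpickler):
    # called once per persistent id; must return the referenced object
    def persistent_load(self, pid):
        return b"external data for " + pid.encode()

buf = io.BytesIO()
RefPickler(buf).dump({"weights": b"\x00\x01", "layers": 5})

buf.seek(0)
try:
    pickle.load(buf)          # no persistent_load -> UnpicklingError
except pickle.UnpicklingError as e:
    print(e)

buf.seek(0)
print(RefUnpickler(buf).load())
```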
TheBadGod
idk why it should not work
03:05
like i did pickletools.dis and it disassembled fine, don't see custom things
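That observation is consistent: `pickletools.dis` only disassembles opcodes, so a stream containing persistent-id opcodes disassembles cleanly even though a plain load fails. A quick illustration with a synthetic pickle (the `ext:` tagging scheme is made up for the demo):

```python
import io
import pickle
import pickletools

class TagPickler(pickle.Pickler):
    # hypothetical writer that externalizes strings tagged "ext:"
    def persistent_id(self, obj):
        if isinstance(obj, str) and obj.startswith("ext:"):
            return obj
        return None

buf = io.BytesIO()
TagPickler(buf).dump(["ext:storage0", 123])

# disassembly works, showing a BINPERSID opcode for the reference...
text = io.StringIO()
pickletools.dis(buf.getvalue(), out=text)
print("PERSID" in text.getvalue())

# ...while pickle.loads(buf.getvalue()) would still raise UnpicklingError
```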
TheBadGod
idk why it should not work
unpickled admin bot 04/30/2023 3:06 AM
google says we need to define a persistent_load function that takes a persistent id and passes back the object (edited)
03:06
???????
03:06
we could do the good old lambda _:1 tbh lmao
03:06
and just see how it goes
03:07
idt that is intended tho
TheBadGod
idk why it should not work
unpickled admin bot 04/30/2023 3:10 AM
I've made a pickle file using the following. from PIL import Image import pickle import os import numpy import time trainpixels = numpy.empty([80000,6400]) trainlabels = numpy.empty(80000) validp...
03:11
could the author have loaded in each weight/bias one at a time or smthng? (edited)
03:12
While trying to load the model given here, I'm facing the following problem: ..... ..... File "/home/sid/text-segmentation/evaluate.py", line 13, in load_model model = torch.load(f) F...
TheBadGod
I think the weights are just the files 0,2,4,6
03:12
biases are files 1,3,5,7
TheBadGod
I think the weights are just the files 0,2,4,6
unpickled admin bot 04/30/2023 3:12 AM
but wtf is that format
03:12
like
TheBadGod
floats
03:13
raw
unpickled admin bot 04/30/2023 3:13 AM
oh
TheBadGod
the file sizes would match
03:13
so the first file (0) has exactly 6969*420*4 bytes (edited)
03:13
file 1 has 6969*4 bytes (which would be biases)
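The size arithmetic checks out if every file is raw little-endian float32, 4 bytes per value. A sketch for the dimensions known at this point in the conversation (the 3-float blob below is a synthetic stand-in, not a real file):

```python
import numpy as np

dims = [420, 6969, 3000, 1337, 8008]   # layer sizes from the disassembly
for i, (n_in, n_out) in enumerate(zip(dims, dims[1:])):
    print(f"file {2*i}: {n_in * n_out * 4:>9} bytes (weight matrix {n_out}x{n_in})")
    print(f"file {2*i+1}: {n_out * 4:>9} bytes (bias vector)")

# reading such a file back is a single frombuffer call:
blob = np.arange(3, dtype="<f4").tobytes()
bias = np.frombuffer(blob, dtype="<f4")
```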
unpickled admin bot 04/30/2023 3:14 AM
hmm
03:14
then the last thing we prob need
03:14
is
TheBadGod
9 has 38 floats though
TheBadGod
i did a bit, there's references to activation functions relu and sigmoid, there are four linear layers dimensions go like this:420 -> 6969 -> 3000 -> 1337 -> 8008 (edited)
unpickled admin bot 04/30/2023 3:14 AM
when are relu and sigmoid called?
03:14
in the sequence
unpickled admin bot
when are relu and sigmoid called?
TheBadGod
i think relu, sigmoid, relu, relu, relu
03:16
obv only 2 relus at the end (edited)
03:16
there is an extra relu though
unpickled admin bot 04/30/2023 3:17 AM
gimme a minute lmao (edited)
TheBadGod
lol
unpickled admin bot 04/30/2023 3:18 AM
ok it finished
03:18
lets goooo
TheBadGod
ok there is another layer, with 38 inputs, which matches with the file 9
03:20
so it actually is (420) -> relu (6969) -> sigmoid (3000) -> relu (1337) -> relu (8008) -> relu (38)
unpickled admin bot 04/30/2023 3:20 AM
welp no sending stuff.pkl
TheBadGod
so it actually is (420) -> relu (6969) -> sigmoid (3000) -> relu (1337) -> relu (8008) -> relu (38)
unpickled admin bot 04/30/2023 3:21 AM
tyty
03:21
wait whats the output though
TheBadGod
good question
03:25
ok got it
03:25
in the binary
03:26
inputs=[51.0,56.0,50.0,66.0,18.0,96.0,67.0,80.0,95.0,43.0,68.0,17.0,72.0,48.0,95.0,20.0,61.0,74.0,66.0,89.0,67.0,31.0,97.0,84.0,78.0,96.0,84.0,94.0,13.0,41.0,46.0,42.0,50.0,63.0,4.0,57.0,25.0,73.0,74.0,40.0,7.0,52.0,23.0,18.0,66.0,74.0,10.0,71.0,98.0,91.0,30.0,79.0,60.0,71.0,1.0,7.0,11.0,58.0,81.0,7.0,19.0,32.0,4.0,48.0,79.0,29.0,55.0,24.0,76.0,28.0,25.0,80.0,55.0,32.0,19.0,90.0,91.0,40.0,29.0,6.0,60.0,29.0,94.0,84.0,93.0,23.0,89.0,30.0,81.0,41.0,64.0,54.0,58.0,97.0,23.0,1.0,10.0,34.0,69.0,93.0,77.0,16.0,35.0,13.0,61.0,2.0,48.0,28.0,83.0,14.0,66.0,10.0,10.0,42.0,37.0,89.0,70.0,30.0,37.0,30.0,76.0,25.0,86.0,8.0,40.0,55.0,83.0,88.0,89.0,81.0,72.0,92.0,47.0,25.0,24.0,18.0,11.0,55.0,29.0,40.0,21.0,89.0,72.0,54.0,66.0,45.0,2.0,24.0,5.0,33.0,88.0,91.0,68.0,22.0,88.0,95.0,70.0,69.0,79.0,82.0,72.0,28.0,93.0,11.0,6.0,18.0,61.0,38.0,62.0,52.0,72.0,67.0,33.0,18.0,95.0,21.0,38.0,44.0,41.0,90.0,83.0,92.0,60.0,45.0,18.0,89.0,63.0,20.0,83.0,5.0,68.0,82.0,5.0,21.0,94.0,10.0,64.0,68.0,77.0,50.0,69.0,25.0,35.0,5.0,8.0,93.0,9.0,4.0,3.0,83.0,38.0,96.0,55.0,63.0,61.0,7.0,61.0,59.0,89.0,75.0,57.0,22.0,82.0,75.0,7.0,43.0,78.0,75.0,86.0,89.0,60.0,27.0,59.0,48.0,70.0,26.0,88.0,92.0,81.0,44.0,10.0,63.0,36.0,10.0,46.0,8.0,64.0,32.0,9.0,29.0,31.0,22.0,95.0,62.0,40.0,3.0,81.0,9.0,42.0,39.0,16.0,33.0,78.0,55.0,61.0,77.0,78.0,7.0,58.0,95.0,10.0,15.0,44.0,16.0,67.0,33.0,84.0,45.0,71.0,11.0,60.0,40.0,29.0,62.0,92.0,10.0,22.0,55.0,36.0,89.0,17.0,62.0,32.0,34.0,76.0,22.0,10.0,45.0,16.0,82.0,60.0,49.0,45.0,71.0,29.0,65.0,9.0,72.0,46.0,6.0,13.0,44.0,25.0,12.0,14.0,82.0,16.0,56.0,75.0,97.0,13.0,20.0,93.0,18.0,77.0,57.0,2.0,77.0,63.0,58.0,96.0,28.0,16.0,34.0,63.0,52.0,10.0,15.0,36.0,63.0,82.0,10.0,11.0,35.0,37.0,86.0,34.0,2.0,95.0,50.0,27.0,93.0,91.0,10.0,88.0,34.0,80.0,99.0,46.0,43.0,33.0,6.0,6.0,89.0,91.0,42.0,55.0,68.0,22.0,7.0,2.0,26.0,64.0,66.0,28.0,45.0,4.0,84.0,66.0,55.0,1.0,41.0,40.0,49.0,83.0,34.0,78.0,48.0,17.0,15.0,11.0,6.0,83.0,15.0,14.0,6.0,80.0,2.0,1.0,60.0,30.0,88.0,77.0,32.0,54.0,56.0,3.0
,90.0,7.0,37.0,85.0,42.0,45.0,69.0,90.0,57.0,15.0,90.0,95.0,40.0]
03:26
so challenge solved, right? Kappa
unpickled admin bot 04/30/2023 3:31 AM
```python
from torch import nn, tensor
import struct
import numpy as np

data_format = "<f"

model = nn.Sequential(
    nn.Linear(420, 6969), nn.ReLU(),
    nn.Linear(6969, 3000), nn.Sigmoid(),
    nn.Linear(3000, 1337), nn.ReLU(),
    nn.Linear(1337, 8008), nn.ReLU(),
    nn.Linear(8008, 38),
)

biases = []
for x in [str(x) for x in range(1, 9, 2)]:
    data = open(x, "rb").read()
    floats = np.array(list(struct.iter_unpack(data_format, data)), dtype=float)
    biases.append(tensor(floats))

weights = []
for x in [str(x) for x in range(0, 9, 2)]:
    data = open(x, "rb").read()
    floats = np.array(list(struct.iter_unpack(data_format, data)), dtype=float)
    weights.append(tensor(floats))

for x in range(0, int(len(model) / 2)):
    model[2*x].bias = nn.Parameter(biases[x])
    model[2*x].weights = nn.Parameter(weights[x])
```
03:31
loads stuff in
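A few things in that paste look off, which may be why the numbers later came out wrong: nn.Linear's parameter is `.weight`, not `.weights` (assigning `.weights` just adds an unused attribute and leaves the random init in place); `range(1, 9, 2)` stops at bias file 7 and `int(len(model)/2) == 4` skips the fifth Linear; and the flat weight files need reshaping to `(out_features, in_features)`. A torch-free sketch of the reshape bookkeeping, demoed on a tiny synthetic 2 -> 3 layer:

```python
import numpy as np

dims = [420, 6969, 3000, 1337, 8008, 38]   # all five layers, incl. the 38-wide output

def parse_layer(weight_blob, bias_blob, n_in, n_out):
    # nn.Linear stores .weight with shape (out_features, in_features),
    # so the flat float32 dump has to be reshaped that way before assignment
    w = np.frombuffer(weight_blob, dtype="<f4").reshape(n_out, n_in)
    b = np.frombuffer(bias_blob, dtype="<f4")
    assert b.shape == (n_out,)
    return w, b

w, b = parse_layer(np.arange(6, dtype="<f4").tobytes(),
                   np.zeros(3, dtype="<f4").tobytes(), 2, 3)
```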
TheBadGod
so challenge solved, right? Kappa
unpickled admin bot 04/30/2023 3:31 AM
maybe but you see its 4:31 am
03:31
so imma do a thing called
03:31
go sleep
TheBadGod
understandable, good night
03:32
or morning
TheBadGod
modified the script so that it at least spits out numbers, they seem wrong though
TheBadGod
used /ctf solve
✅ Challenge solved.
TheBadGod
how tf does the members work in the solve command
05:24
entered quasar, but didn't take it for some reason
05:24
final solve script
3.87 KB
05:25
flag was UMDCTF{i_love_revving_torch_binaries!}
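Worth noting: the flag is exactly 38 characters, matching the model's 38 outputs, so presumably the model's outputs round to the flag's ASCII codes. A sketch with made-up output values (the real ones would come from running the rebuilt model on the 420 inputs from the binary):

```python
# hypothetical output floats; illustrative values only
out = [84.97, 77.2, 68.01, 66.9, 84.1, 70.0]   # would continue for all 38
flag_prefix = "".join(chr(round(v)) for v in out)
print(flag_prefix)   # prints UMDCTF for these sample values
```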
Exported 110 message(s)